Python Job: Data Test Engineer (SDET)

Company

Geomotiv
Portugal

Location

Remote Position
(From Everywhere/No Office Location)

Job type

Full-Time

Python Job Details

ABOUT THE PROJECT

It is a leading free streaming television service in America, delivering 100+ live and original channels and thousands of on-demand movies in partnership with major TV networks, movie studios, publishers, and digital media companies. Millions of viewers tune in each month to watch premium news, TV shows, movies, sports, lifestyle, and trending digital series. The service is available on all mobile, web, and connected TV streaming devices.
SUMMARY

At this project, we approach testing differently: we test, and break, code constantly, but we help rebuild it better. Data Test Engineers (SDETs) test and verify applications built on data pipelines using the Java programming language and Apache Kafka, and work closely with the data development teams to validate events and analytics.

This role is a DTE with a focus on data application validation. You will apply your SDET and SQL experience to verify multiple data applications and tools. The DTE will work with Business Intelligence analysts and developers to ensure that data and application quality and integrity are maintained.

The position requires strong SDET experience with a focus on data, knowledge of data pipelines from raw data to reporting, and demonstrable SQL skills. The DTE will also represent the Software Test Engineering Team in scrum meetings and work alongside product management and development teams to improve quality coverage for the supported applications.
WHAT YOU’LL BE UP TO:

  • Work with project development teams implementing analytics features into client applications;
  • Design and develop manual and automated test cases to validate new or existing data integration solutions using Java, meeting data pipeline business requirements;
  • Verify applications and tools built on data warehousing platforms such as AWS Redshift, Snowflake, or other columnar databases;
  • Develop best practices for data integration/streaming;
  • Design and develop data integration/engineering workflows on Big Data technologies and platforms;
  • Verify capturing of analytics events in related file systems or databases through SQL, or a scripting language (Python, Java, shell scripting, etc.);
  • Work with Business Intelligence and Product Management to create test strategies, plans and cases that provide acceptable coverage for a given data pipeline, from event creation to reporting;
  • Work in an Agile Software Delivery methodology, with a strong focus on creating data validation tests based on requirements.
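The event-validation work described above can be sketched in miniature. The following is a hypothetical example (all table and column names are invented for illustration, using SQLite in place of a real warehouse) of checking that an aggregated reporting table agrees with the raw events it was built from:

```python
import sqlite3

# Hypothetical sketch: validate an aggregate table against raw pipeline events.
# Table and column names are illustrative, not from any real project.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Raw analytics events as they might land from a streaming pipeline
cur.execute("CREATE TABLE raw_events (event_type TEXT, user_id INTEGER)")
cur.executemany(
    "INSERT INTO raw_events VALUES (?, ?)",
    [("play", 1), ("play", 2), ("pause", 1), ("play", 3), ("pause", 2)],
)

# Aggregated table as a downstream reporting job might produce it
cur.execute("CREATE TABLE event_counts (event_type TEXT, n INTEGER)")
cur.executemany(
    "INSERT INTO event_counts VALUES (?, ?)",
    [("play", 3), ("pause", 2)],
)

def find_count_mismatches(cur):
    """Return event types whose raw count disagrees with the aggregate."""
    cur.execute(
        """
        SELECT r.event_type, COUNT(*) AS raw_n, c.n AS agg_n
        FROM raw_events r
        JOIN event_counts c ON c.event_type = r.event_type
        GROUP BY r.event_type
        HAVING raw_n != c.n
        """
    )
    return cur.fetchall()

# An empty result means the aggregation is consistent with the raw data
mismatches = find_count_mismatches(cur)
assert mismatches == [], f"aggregation drift detected: {mismatches}"
```

In a real role the same comparison would run against warehouse tables via JDBC or a test framework, but the structure of the check is the same: query both ends of the pipeline and assert they reconcile.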

WHAT YOU’LL NEED TO HAVE:

  • 5+ years of Quality Assurance/Testing experience;
  • Excellent SQL knowledge is a must (aggregation, window functions);
  • Experience in Java, Maven, TestNG, JDBC, WireMock, Git;
  • Experience in building up a test automation framework;
  • Some engineering experience and practice in Data Management, Data Quality verification/Data Governance, Data Integration;
  • Good understanding of data pipelines, Data Lakes, ETL testing;
  • Experience in AWS, Snowflake, Kafka (basic knowledge);
  • Understanding of Big Data principles;
  • Experience in Data analysis & requirements validation;
  • Concrete experience in Data project Test Planning, Test Case design, Test Result Reporting;
  • Knowledge of CI/CD principles and best practices for data processing, e.g. Jenkins.

Job Type: Full-time

Salary: 55,182.56€ - 128,672.41€ per year